Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia


Job Summary
The Data Analyst is responsible for the data capture, data cleaning, data preparation, data management, and data analysis and interpretation required by the business, and handles the data factory with precision and confidentiality. The role is crucial for providing actionable insights to improve teaching quality, enhance student performance, optimize business operations, and drive growth.
Essential Job Responsibilities:
Academic Data Analysis
· Analyze student performance data from internal assessments, mock tests, and board exams.
· Identify trends in subject-wise performance, batch-wise progress, and dropout patterns.
· Generate reports to assist academic heads in making data-driven interventions.
· Predict outcomes of upcoming competitive exams (JEE/NEET) based on historical data.
Business & Marketing Intelligence
· Monitor enrollment trends, inquiry-to-admission conversions, and campaign ROI.
· Provide insights into market behavior, location-wise performance, and competitor benchmarking.
· Analyze fee structures, discounts, and scholarship schemes for profitability and optimization.
Operational & Centre Efficiency
· Evaluate the performance of branches/centres using KPIs like retention rate, attendance, student satisfaction, and staff productivity.
· Support planning for new center openings with predictive enrollment data and location-based analytics.
Technology & Automation
· Build and maintain dashboards using BI tools (Power BI, Tableau, Excel, Google Data Studio).
· Work closely with IT or CRM teams to ensure data accuracy and integrity across systems.
· Automate regular reporting processes to reduce manual workload.
Strategic Decision Support
· Support senior management with forecasting models, statistical reports, and scenario simulations (a minimal sketch follows this list).
· Present insights in the form of structured reports, visual dashboards, or presentations.
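As referenced above, a minimal forecasting sketch that fits a linear trend to monthly enrollment counts and projects the next three months; the figures are illustrative assumptions, not actual data.

```python
# Sketch: fit a simple linear trend to monthly enrollment counts and
# project the next three months. The counts below are illustrative only.
import numpy as np

enrollments = np.array([310, 325, 340, 362, 371, 390])  # assumed last 6 months
months = np.arange(len(enrollments))

# Least-squares linear fit: enrollments ~ slope * month + intercept
slope, intercept = np.polyfit(months, enrollments, deg=1)

future_months = np.arange(len(enrollments), len(enrollments) + 3)
forecast = slope * future_months + intercept

print("projected enrollments:", np.round(forecast).astype(int))
```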
Supervisory Responsibilities
Number of subordinate supervisors reporting to this job
N/A
Total number of employees supervised; include those directly supervised and those supervised through subordinate supervisors
N/A
Job Qualifications:
Education
· Bachelor’s or Master’s degree in Statistics, Mathematics, Engineering, Data Science, Computer Science, or related field.
Experience
· Must be highly skilled, with 3+ years of experience.
· Demonstrated ability to successfully process all work types.
Knowledge / Skills
· Analytical skills & Critical Thinking
· Data Visualization
· Microsoft Excel, VBA, R, Python, Tableau, Power BI
· Proficient in AI-powered data analysis tools
Licenses / Certifications
· A data analytics certification is desirable.
Working Conditions:
Virtual environment; extended periods of time in a sitting position; medium-stress environment.
Physical Demands:
N/A
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis, and data integration, and in implementing enterprise-level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm (a minimal sketch follows this list).
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on a cluster for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; imported data from different sources into HDFS using Sqoop and performed transformations using Hive and MapReduce before loading the results back into HDFS (see the ingestion sketch after this list).
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for different ingestion and aggregation events, loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on a cluster for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, Maven, and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle, and Netezza to HDFS, and in performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with a Hadoop cluster.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
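As referenced in the streaming bullet above, here is a minimal sketch of a Spark Structured Streaming job reading from Kafka; the broker address, topic name, and event schema are illustrative assumptions rather than details from the experience listed.

```python
# Minimal Spark Structured Streaming sketch: read JSON events from an
# assumed Kafka topic, count events per type, and print the running
# result to the console. Requires the spark-sql-kafka connector package
# on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("event-stream-sketch").getOrCreate()

schema = StructType([
    StructField("event_type", StringType()),
    StructField("user_id", StringType()),
    StructField("ts", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
       .option("subscribe", "events")                        # assumed topic
       .load())

events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

counts = events.groupBy("event_type").count()

query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```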
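And, as referenced in the ingestion bullet, a hedged sketch of an RDBMS-to-Hive flow expressed with Spark's JDBC reader and Spark SQL rather than Sqoop itself; the JDBC URL, credentials, database, and table names are placeholders.

```python
# Sketch of an ingestion step: pull a table from an RDBMS over JDBC,
# aggregate it with Spark SQL, and persist the result as a managed table
# that BI tools such as Tableau can query. All connection details and
# names below are placeholders.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("ingest-sketch")
         .enableHiveSupport()
         .getOrCreate())

orders = (spark.read
          .format("jdbc")
          .option("url", "jdbc:mysql://db-host:3306/sales")  # placeholder URL
          .option("dbtable", "orders")                       # placeholder table
          .option("user", "etl_user")
          .option("password", "***")
          .load())

orders.createOrReplaceTempView("orders_raw")

daily = spark.sql("""
    SELECT order_date, COUNT(*) AS order_count, SUM(amount) AS revenue
    FROM orders_raw
    GROUP BY order_date
""")

# Assumes an existing 'analytics' database in the metastore.
daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")
```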
Responsibilities
- Deliver full-cycle Tableau development projects, from business needs assessment and data discovery, through solution design, to delivery to the client.
- Enable our clients and ourselves to answer questions and develop data-driven insights through Tableau.
- Provide technical leadership and support across all aspects of Tableau development and use, from data specification development, through data mart development, to supporting end-user dashboards and reports.
- Administer Tableau Server by creating sites, adding/removing users, and providing the appropriate level of access for users (a minimal sketch follows this list).
- Strategize and ideate the solution design. Develop UI mock-ups, storyboards, flow diagrams, conceptual diagrams, wireframes, visual mockups, and interactive prototypes.
- Develop best-practice guidelines for Tableau data processing and visualization. Use these best practices to quickly deliver functionality across the client base and internal users.
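As a companion to the administration bullet above, a minimal sketch of adding a user to Tableau Server with the tableauserverclient Python library; the server URL, credentials, site, and user details are placeholders, and the same task can equally be done through the Tableau Server UI or tabcmd.

```python
# Sketch: sign in to Tableau Server and add a user with a given site role.
# Server URL, credentials, site, and user details are placeholders.
import tableauserverclient as TSC

auth = TSC.TableauAuth("admin_user", "admin_password", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Create the user record with an explicit site role.
    new_user = TSC.UserItem("jane.doe", site_role="Explorer")
    new_user = server.users.add(new_user)
    print(f"Added user {new_user.name} with role {new_user.site_role}")
```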
Qualifications
- Degree in a highly relevant analytical or technical field, such as statistics, data science, or business analytics.
- 5+ years as a Tableau developer and administrator.
- Extensive experience with large data sets, statistical analyses, and visualization, as well as hands-on experience with tools (SQL, Tableau, Power BI).
- Ability to learn quickly and take responsibility for delivery.

About Quizizz
Quizizz is one of the fastest-growing EdTech platforms in the world. Our team is on a mission to motivate every student and our learning platform is used by more than 75 million people per month in over 125 countries, including 80% of U.S. schools.
We have phenomenal investors, we’re profitable, and we’re committed to growing and improving every day. If you’re excited about international SaaS and want to build towards a mission that you can be proud of then Quizizz might be a good fit for you.
We currently have offices in India and the U.S. with incredible team members around the world and we hope you’ll join us.
Role
We are looking for an experienced Product Analyst. The role offers an exciting opportunity to significantly shape the future of the product. The team is responsible for supporting decisions made by other teams to improve the growth, engagement, and revenue of the platform. Furthermore, the team sets up and maintains internal tools, apps, dashboards, and processes, and acts as an arbiter of information within the organization.
The variety of tasks is immense and will give you the chance to play to your strengths. Tasks could include improving search and recommendations, data mining, identifying potential customers, ad-hoc analyses, creating APIs for internal consumption, et cetera.
Some of the challenges you will face include:
- Working cross-functionally with design, engineering, sales and marketing teams to aid in decision making.
- Analyzing and drawing conclusions from experiments on new product features (a minimal sketch follows this list).
- Creating, maintaining, and modifying internal dashboards, apps and reports being used, as part of the larger analytics function at Quizizz.
- Deep diving into data to extract insights that could help explain a certain phenomenon.
- Organizing the analytics warehouse, as and when new data is added.
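For the experiment-analysis item above, a minimal sketch of a two-proportion z-test comparing conversion between a control and a variant group; the counts are invented purely for illustration.

```python
# Sketch: compare conversion rates between a control and a variant group
# with a two-proportion z-test. The counts below are illustrative only.
from statsmodels.stats.proportion import proportions_ztest

conversions = [480, 530]      # converted users: [control, variant]
exposures = [10000, 10000]    # users exposed:   [control, variant]

z_stat, p_value = proportions_ztest(conversions, exposures)
lift = conversions[1] / exposures[1] - conversions[0] / exposures[0]

print(f"absolute lift: {lift:.4f}, z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) suggests the difference in conversion is
# unlikely to be due to chance alone.
```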
Requirements:
- At least 2 years of industry experience, providing solutions to business problems in a cross-functional team.
- Versatility to communicate clearly with both technical and non-technical audiences.
- SQL expertise and strong programming skills (Python preferred).
- Mathematical thinking.
- Attention to Detail.
Good to have:
- Experience with Jupyter (/iPython) notebooks.
- Experience using a data visualization tool such as Tableau, Google Data Studio, Qlikview, Power BI, RShiny.
- Ability to create simple data apps/APIs; we use Flask or Node.js (see the sketch after this list).
- Knowledge of Natural Language Processing techniques.
- Data analytical and data engineering experience.
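For the data apps/APIs item above, a minimal Flask sketch of an internal data endpoint; the route and the hard-coded payload are illustrative assumptions.

```python
# Minimal Flask sketch of an internal data API. The endpoint path and the
# hard-coded metrics are placeholders; a real service would query the
# analytics warehouse instead.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/metrics/daily-active-users")
def daily_active_users():
    # Placeholder payload; replace with a warehouse query in practice.
    return jsonify({"date": "2024-01-01", "dau": 123456})

if __name__ == "__main__":
    app.run(port=5000, debug=True)
```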
Benefits:
At Quizizz, we have built a world-class team of talented individuals. While we all care deeply about our work, we also ensure that we maintain a healthy work-life balance. Our policies are designed to ensure the well-being and comfort of our employees. Some of the benefits we offer include:
- Healthy work-life balance. Put in your 8 hours, and enjoy the rest of your day.
- Flexible leave policy. Take time off when you need it.
- Comprehensive health coverage of Rs. 6 lakhs, covering the employee and their parents, spouse, and children. Pre-existing conditions are covered from day 1, along with benefits like free doctor consultations and more.
- Relocation support including travel and accommodation, and we'll also pay for a broker to find your home in Bangalore!
- Rs. 20,000 annual health and wellness allowance.
- Professional development support. We will reimburse you for relevant courses and books that you need to become a better professional.
- Delicious meals, including breakfast and lunch served at the office, and a fully stocked pantry for all your snacking needs.
About SpringML
At SpringML, we are all about empowering the ‘doers’ in companies to make smarter decisions with their data. Our predictive analytics products and solutions apply machine learning to today’s most pressing business problems, so customers get insights they can trust to drive business growth.
We are a tight-knit, friendly team of passionate and driven people who are dedicated to learning, get excited about solving tough problems, and like seeing results fast. Our core values include putting our customers first, empathy and transparency, and innovation. We are a team with a focus on individual responsibility, rapid personal growth, and execution. If you share similar traits, we want you on our team.
What’s the opportunity?
SpringML is looking for a top-notch Salesforce Tableau CRM expert. You will play a critical role in our client engagements using the Salesforce Analytics Cloud platform (Tableau CRM, Einstein Discovery).
You will design and implement highly customized solutions for our customers' business problems, typically across multiple functions of a customer's organization, through data integration, visualization, and analysis.
Responsibilities:
- Translate business needs into technical specifications
- Design and deploy dataflows/recipes as per the business requirements
- Maintain and support the existing analytics platforms built using Tableau CRM
- Conduct unit testing and troubleshoot issues
- Create visualizations using native built-in features/SAQL
- Implement security best practices
- Design dashboards/recipes using best practices
- Develop and update the technical documentation

What you will do:
- Bringing data to life via historical and real-time dashboards, such as:
1. Transaction behavior analytics
2. User-level analytics
3. Propensity models and personalization models, e.g., what's the best product or offer to drive a sale
4. Emails/SMS/AN: analytics models getting data from platforms like Netcore + Branch + internal tables
5. Other reports relevant to understanding what is working, where the gaps are, etc.
- Monitoring key metrics such as commission, gross margin, conversion, customer acquisitions, etc. (a minimal sketch follows this list)
- Using the data models and reports to draw actionable and meaningful insights, and based on those insights, helping drive strategy, optimization opportunities, product improvements, and more
- Demonstrating examples where data interpretation led to improvements in core business outcomes like better conversion, better ROI from ad spend, improvements in the product, etc.
- Digging into data to identify opportunities or problems and translating them into an easy-to-understand form for all key business teams
- Working closely with various business stakeholders: Marketing, Customer Insights, Growth and Product teams
- Ensuring effective and timely delivery of reports and insights that analyze business functions and key operations and performance metrics
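For the metric-monitoring item above, a hedged pandas sketch computing conversion rate and gross margin from a hypothetical transactions table; the column names and figures are assumptions for illustration.

```python
# Sketch: compute conversion rate and gross margin from a hypothetical
# transactions table. Column names and sample rows are assumptions.
import pandas as pd

transactions = pd.DataFrame({
    "session_id": [1, 2, 3, 4, 5],
    "purchased": [1, 0, 1, 0, 0],
    "revenue": [500.0, 0.0, 300.0, 0.0, 0.0],
    "cost": [350.0, 0.0, 240.0, 0.0, 0.0],
})

conversion_rate = transactions["purchased"].mean()
revenue = transactions["revenue"].sum()
gross_margin = (revenue - transactions["cost"].sum()) / revenue

print(f"conversion rate: {conversion_rate:.1%}")
print(f"gross margin:    {gross_margin:.1%}")
```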
What you need to have:
- Minimum 2 years of data analytics and interpretation experience
- Proven experience showing how data analytics shaped strategy, marketing, and product; should have multiple examples of this from current and past experience
- Strong data engineering skills using tools like SQL, Python, Tableau, Power BI, Advanced Excel, Power Query, etc.
- Strong marketing acumen to be able to translate data into marketing outcomes
- Good understanding of Google Analytics, Google AdWords, Facebook Ads, CleverTap and other analytics tools
- Familiarity with data sources like Branch/CleverTap/Firebase, BigQuery, and internal tables (user/click/transaction/events, etc.)
Tableau Developer |
Tableau Prep (must) + Tableau + SQL |
Must have experience in Tableau Prep
Minimum 4 years of experience
Should have experience in SQL

- Involvement in the overall application lifecycle
- Design and develop software applications in Scala and Spark
- Understand business requirements and convert them to technical solutions
- Rest API design, implementation, and integration
- Collaborate with Frontend developers and provide mentorship for Junior engineers in the team
- An interest and preferably working experience in agile development methodologies
- A team player, eager to invest in personal and team growth
- Staying up to date with cutting edge technologies and best practices
- Advocate for improvements to product quality, security, and performance
Desired Skills and Experience
- Minimum 1.5 years of development experience in Scala / Java
- Strong understanding of the development cycle, programming techniques, and tools.
- Strong problem solving and verbal and written communication skills.
- Experience in working with web development using J2EE or similar frameworks
- Experience in developing REST APIs
- BE in Computer Science
- Experience with Akka or Micro services is a plus
- Experience with Big Data technologies like Spark/Hadoop is a plus.
The company offers very competitive compensation packages commensurate with your experience. We offer full benefits, continual career & compensation growth, and many other perks.


